# 40 Miscellaneous remarks on eigenvalues, eigenvectors, and diagonalization.

I would like to record some results on eigenvalues, eigenvectors, and diagonalization that we didn't have time to discuss in depth, but you can certainly use all the results here and should know them.

## Miscellaneous facts.

> If an $n\times n$ matrix $A$ has $n$ distinct eigenvalues, then $A$ is diagonalizable.

Warning: the converse is not always true! A diagonalizable matrix can have repeated eigenvalues.

> An $n\times n$ matrix $A$ is invertible if and only if $A$ does not have $0$ as an eigenvalue.

> If an $n\times n$ matrix $A$ is upper or lower triangular, then the eigenvalues of $A$ are just the entries on the diagonal of $A$.

> If an $n\times n$ matrix $A$ has two different eigenvalues $\lambda\neq \mu$, then eigenvectors with eigenvalue $\lambda$ are linearly independent from eigenvectors with eigenvalue $\mu$.

> If an $n\times n$ matrix $A$ has eigenvalues $\lambda_{1}, \lambda_{2},\ldots,\lambda_{n}$ (counted with algebraic multiplicity), then
> (1) $\det(A) = \lambda_{1}\lambda_{2}\cdots\lambda_{n}$
> (2) $\text{tr}(A) = \lambda_{1} + \lambda_{2}+\cdots +\lambda_{n}$

> If an $n\times n$ matrix $A$ has $\lambda$ as an eigenvalue, then $$ 1\le \text{gm}(\lambda) \le \text{am}(\lambda)\le n. $$

> **Cayley-Hamilton.**
> If $A$ is an $n\times n$ matrix and $p_{A}(t)$ is its characteristic polynomial, then substituting the matrix $A$ into the polynomial always gives the zero matrix, that is, $$ p_{A}(A) = O_{n\times n}. $$

In general, it can be tedious or difficult to find eigenvalues, eigenvectors, or to decide diagonalizability for an arbitrary matrix. However, there are some special scenarios where we know what will happen. I would like to mention two important such cases.

## Perron-Frobenius theorem.

> **Perron-Frobenius theorem.**
> Let $A$ be an $n\times n$ matrix where each entry $a_{ij} > 0$ is strictly positive. Then $A$ has a positive eigenvalue $r > 0$ such that every other eigenvalue $\lambda$ of $A$ has magnitude strictly smaller than $r$, that is, $|\lambda| < r$. Furthermore, this eigenvalue $r$ has algebraic multiplicity $1$, and hence geometric multiplicity $1$. And even furthermore, an eigenvector $\vec v$ of $r$ can be chosen such that every entry of $\vec v$ is strictly positive, and the only eigenvectors of $A$ with all positive entries are those from the eigenspace $E_{r}$.

We sometimes call this eigenvalue $r$ the **Perron root** of $A$, and a strictly positive eigenvector $\vec v \in E_{r}$ a **Perron vector**.

Matrices with all positive entries are surprisingly common. For example, the stochastic matrices $M$ in a discrete dynamical system problem often have all positive entries. In this case, the eigenvalue $1$ is in fact the Perron root of $M$, and the equilibrium solution vector $\vec p_{\text{eq}}$, where $M\vec p_{\text{eq}} =\vec p_{\text{eq}}$, is a Perron vector!

## Real spectral theorem.

> **Real spectral theorem.**
> Let $A$ be an $n\times n$ real symmetric matrix. Then $A$ is orthogonally diagonalizable over the reals. That is, $A$ is diagonalizable and the eigenbasis can be chosen so that the basis vectors are all perpendicular to each other.

This is a profound result. First, it shows that any time you see a real symmetric matrix $A$, where $A=A^{T}$, it is automatically diagonalizable over the reals, and we have an orthogonal eigenbasis. Real symmetric matrices show up often in physics, or in any system that describes objects with symmetric interactions with each other.
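The determinant and trace facts above are easy to sanity-check numerically. Here is a small sketch using NumPy (not part of the notes, just an illustration): the product of the eigenvalues matches $\det(A)$ and their sum matches $\text{tr}(A)$, up to floating-point error.

```python
import numpy as np

# A sample 3x3 matrix; any square matrix works for this check.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

eigenvalues = np.linalg.eigvals(A)

# det(A) = product of eigenvalues, tr(A) = sum of eigenvalues
# (eigenvalues counted with algebraic multiplicity).
det_from_eigs = np.prod(eigenvalues)
trace_from_eigs = np.sum(eigenvalues)

print(np.isclose(np.linalg.det(A), det_from_eigs))   # True
print(np.isclose(np.trace(A), trace_from_eigs))      # True
```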
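Cayley-Hamilton can also be verified directly for a small example. For a $2\times 2$ matrix the characteristic polynomial is $p_{A}(t) = t^{2} - \text{tr}(A)\,t + \det(A)$, so substituting $A$ should give the zero matrix. A minimal check (again using NumPy as an assumed tool):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# For a 2x2 matrix, p_A(t) = t^2 - tr(A) t + det(A).
# Cayley-Hamilton says p_A(A) should be the zero matrix.
p_of_A = A @ A - np.trace(A) * A + np.linalg.det(A) * np.eye(2)

print(np.allclose(p_of_A, np.zeros((2, 2))))  # True
```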
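The stochastic-matrix remark in the Perron-Frobenius section can be seen concretely. Below is a sketch (NumPy assumed, and the particular matrix `M` is made up for illustration) with a column-stochastic matrix whose entries are all positive: its Perron root is $1$, and rescaling the corresponding eigenvector to have entries summing to $1$ gives the equilibrium vector $\vec p_{\text{eq}}$ with all positive entries.

```python
import numpy as np

# A column-stochastic matrix with strictly positive entries
# (each column sums to 1), as in a discrete dynamical system.
M = np.array([[0.7, 0.2],
              [0.3, 0.8]])

eigenvalues, eigenvectors = np.linalg.eig(M)

# The Perron root of a positive column-stochastic matrix is 1,
# and every other eigenvalue is strictly smaller in magnitude.
k = np.argmax(eigenvalues.real)
print(np.isclose(eigenvalues[k].real, 1.0))  # True

# Rescale the eigenvector for r = 1 so its entries sum to 1:
# this is the equilibrium vector p_eq, a Perron vector.
p_eq = eigenvectors[:, k].real
p_eq = p_eq / p_eq.sum()

print(np.all(p_eq > 0))           # True: strictly positive entries
print(np.allclose(M @ p_eq, p_eq))  # True: M p_eq = p_eq
```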
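Finally, the real spectral theorem can be illustrated numerically. NumPy's `eigh` routine (assumed available, as above) is designed for symmetric matrices and returns an orthonormal eigenbasis, so the matrix $Q$ of eigenvectors satisfies $Q^{T}Q = I$ and $A = QDQ^{T}$:

```python
import numpy as np

# A real symmetric matrix: A equals its transpose.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh handles symmetric matrices and returns real eigenvalues
# together with an orthonormal eigenbasis (columns of Q).
eigenvalues, Q = np.linalg.eigh(A)

# The eigenvectors are perpendicular to each other: Q^T Q = I.
print(np.allclose(Q.T @ Q, np.eye(2)))  # True

# Orthogonal diagonalization: A = Q D Q^T.
D = np.diag(eigenvalues)
print(np.allclose(A, Q @ D @ Q.T))      # True
```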